Add example for a job writing to a Unity Catalog volume #51
Conversation
```diff
@@ -0,0 +1,60 @@
{
```
This should render fine in the GitHub repo file view. For example: https://github.com/databricks/cli/blob/main/internal/testdata/notebooks/py1.ipynb
pietern left a comment:
Thanks!
knowledge_base/save_job_result_to_volume/resources/my_volume.volume.yml (outdated; resolved)
knowledge_base/save_job_result_to_volume/resources/top_ten_trips.job.yml (outdated; resolved)
knowledge_base/save_job_result_to_volume/resources/top_ten_trips.job.yml (outdated; resolved)
```yaml
name: my_volume
# We use the ${resources.schemas...} interpolation syntax to force the creation
# of the schema before the volume. Usage of the ${resources.schemas...} syntax
# allows Databricks Asset Bundles to form a dependency graph between resources.
```
Do we need to go into this here?
We have had multiple SAs reach out and ask how to sequence resource creation. Given that folks often use schemas with their volumes, it feels relevant to keep this bit here.
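For context, a minimal sketch of the pattern being discussed (resource and catalog names here are hypothetical, not the PR's actual values):

```yaml
resources:
  schemas:
    my_schema:
      catalog_name: main
      name: my_schema
  volumes:
    my_volume:
      catalog_name: main
      # Interpolating ${resources.schemas.my_schema.name} rather than writing
      # the schema name literally gives the bundle an explicit dependency,
      # so the schema is deployed before the volume.
      schema_name: ${resources.schemas.my_schema.name}
      name: my_volume
```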
Thanks. Linking the warning PR for posterity: databricks/cli#1989
| "metadata": {}, | ||
| "outputs": [], | ||
| "source": [ | ||
| "file_path = dbutils.widgets.get(\"file_path\")\n", |
This doesn't actually work, does it? Without a dbutils.widgets.text() call and/or a widgets section in the ipynb JSON below.
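To illustrate the point about call order: a sketch with a tiny stand-in for dbutils.widgets (dbutils only exists inside a Databricks runtime, and all names and paths below are illustrative). Declaring the widget with text() registers it and its default before get() reads it.

```python
class _Widgets:
    """Minimal stand-in for dbutils.widgets, to show the call order only."""

    def __init__(self):
        self._values = {}

    def text(self, name, default_value, label=None):
        # Registers the widget with its default; in a real notebook this
        # also renders an input field and accepts a job-parameter override.
        self._values.setdefault(name, default_value)

    def get(self, name):
        # Fails if the widget was never declared and no parameter supplied
        # a value -- hence the need for text() (or a widgets section) first.
        return self._values[name]


widgets = _Widgets()
widgets.text("file_path", "/Volumes/main/default/my_volume/out.txt")
file_path = widgets.get("file_path")
```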
knowledge_base/write_from_job_to_volume/resources/hello_world.job.yml (outdated; resolved)
Hi @shreyas-goenka and @pietern: does DAB have a capability similar to the notebook SQL command CREATE SCHEMA IF NOT EXISTS? Could you please confirm? Our deployment is failing with the following error: [2025-02-04T11:14:03.488Z] Error: cannot create schema: Schema 'xyz_sv_cd_sch' already exists
Application teams have already created these schemas and volumes using a notebook in the Dev environment, but not yet in Test and Prod, so we are trying to observe the behavior of a DAB deployment in the Dev environment when the schema and volume already exist. The deployment failed.

This example demonstrates how a job can write a file to a Unity Catalog volume.
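At its core, the job's task is ordinary file I/O: on Databricks a Unity Catalog volume appears as a regular path under /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;/. A sketch of the idea, with a temporary local directory standing in for the volume path so it can run anywhere (the path and file name are illustrative):

```python
import os
import tempfile

# Stand-in for a volume root such as /Volumes/main/default/my_volume
volume_root = tempfile.mkdtemp()

# Write the job's result as a plain file under the volume path.
out_path = os.path.join(volume_root, "result.txt")
with open(out_path, "w") as f:
    f.write("hello from the job\n")
```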